Upper bounds on the relative entropy and Rényi divergence as a function of total variation distance for finite alphabets
Abstract
A new upper bound on the relative entropy is derived as a function of the total variation distance for probability measures defined on a common finite alphabet. It improves a previously reported bound by Csiszár and Talata, and is further extended to an upper bound on the Rényi divergence of an arbitrary non-negative order (including ∞) as a function of the total variation distance.
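The abstract does not state the new bound itself, but the quantities it relates are easy to compute. The sketch below, a minimal illustration with made-up distributions, evaluates the relative entropy D(P||Q), the total variation distance, and the minimum mass Q_min of the reference measure, which is the alphabet-dependent quantity that reverse Pinsker bounds of the Csiszár–Talata type involve; it also checks the classical Pinsker lower bound for contrast.

```python
import math

def kl_divergence(p, q):
    """Relative entropy D(P||Q) in nats for finite distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def total_variation(p, q):
    """Total variation distance |P - Q|_TV = (1/2) * L1 distance."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

# Example distributions on a common 3-letter alphabet (illustrative only).
P = [0.5, 0.3, 0.2]
Q = [0.4, 0.4, 0.2]

D = kl_divergence(P, Q)
tv = total_variation(P, Q)

# Pinsker's inequality (lower bound): D(P||Q) >= 2 * TV^2 in nats.
assert D >= 2 * tv ** 2

# Reverse Pinsker inequalities go the other way: on a finite alphabet they
# upper-bound D(P||Q) by a function of TV and Q_min = min Q(x); some
# dependence on Q_min is unavoidable, since D can blow up as Q_min -> 0
# while TV stays small.
q_min = min(Q)
print(D, tv, q_min)
```

The asymmetry between the two directions is the point: Pinsker's inequality is alphabet-free, while any upper bound in terms of total variation must bring in a property of Q such as Q_min.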
منابع مشابه
IRWIN AND JOAN JACOBS CENTER FOR COMMUNICATION AND INFORMATION TECHNOLOGIES On Reverse Pinsker Inequalities
New upper bounds on the relative entropy are derived as a function of the total variation distance. One bound refines an inequality by Verdú for general probability measures. A second bound improves the tightness of an inequality by Csiszár and Talata for arbitrary probability measures that are defined on a common finite set. The latter result is further extended, for probability measures on a ...
Convexity/concavity of Rényi entropy and α-mutual information
Entropy is well known to be Schur concave on finite alphabets. Recently, the authors have strengthened the result by showing that for any pair of probability distributions P and Q with Q majorized by P, the entropy of Q is larger than the entropy of P by the amount of relative entropy D(P||Q). This result applies to P and Q defined on countable alphabets. This paper shows the counterpart of t...
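The strengthened Schur-concavity result quoted above is easy to verify numerically. The sketch below, a small self-contained check with illustrative distributions, tests majorization via partial sums of the sorted masses and then confirms that the entropy gap H(Q) - H(P) is at least D(P||Q).

```python
import math

def entropy(p):
    """Shannon entropy H(P) in nats."""
    return -sum(x * math.log(x) for x in p if x > 0)

def kl(p, q):
    """Relative entropy D(P||Q) in nats."""
    return sum(x * math.log(x / y) for x, y in zip(p, q) if x > 0)

def majorizes(p, q):
    """True if P majorizes Q: every partial sum of P's masses sorted in
    decreasing order dominates the corresponding partial sum for Q."""
    ps, qs = sorted(p, reverse=True), sorted(q, reverse=True)
    run_p = run_q = 0.0
    for a, b in zip(ps, qs):
        run_p += a
        run_q += b
        if run_p < run_q - 1e-12:
            return False
    return True

# Illustrative pair: P majorizes Q, so P is "more concentrated".
P = [0.7, 0.2, 0.1]
Q = [0.5, 0.3, 0.2]

assert majorizes(P, Q)
# Plain Schur concavity gives H(Q) >= H(P); the strengthened form quoted
# in the abstract says the gap is at least D(P||Q).
assert entropy(Q) >= entropy(P) + kl(P, Q)
```

For P = (1, 0) and Q = (1/2, 1/2) the inequality holds with equality: both the entropy gap and D(P||Q) equal log 2.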
Some properties of Rényi entropy over countably infinite alphabets
In this paper we study certain properties of Rényi entropy functions Hα(P) on the space of discrete probability distributions with infinitely many probability masses. We prove some properties that parallel those known in the finite case. Some properties on the other hand are quite different in the infinite case, for example the (dis)continuity in P and the problem of divergence and behaviour of...
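On a finite alphabet the Rényi entropy H_α(P) is straightforward to compute, which makes the finite-case properties that the paper extends easy to see. The sketch below, using an illustrative distribution, implements H_α(P) = (1/(1-α)) log Σ p_i^α with its standard limiting cases (α = 0 gives the log of the support size, α → 1 gives Shannon entropy, α = ∞ gives the min-entropy) and checks that H_α is nonincreasing in α.

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(P) in nats for a finite distribution p."""
    if alpha == 0:
        # Hartley / max-entropy: log of the support size.
        return math.log(sum(1 for x in p if x > 0))
    if alpha == float('inf'):
        # Min-entropy: -log of the largest mass.
        return -math.log(max(p))
    if alpha == 1:
        # Shannon entropy, the alpha -> 1 limit.
        return -sum(x * math.log(x) for x in p if x > 0)
    return math.log(sum(x ** alpha for x in p if x > 0)) / (1 - alpha)

P = [0.5, 0.25, 0.25]  # illustrative distribution

# H_alpha is nonincreasing in alpha; on countably infinite alphabets,
# continuity in P and finiteness become the delicate questions.
alphas = [0, 0.5, 1, 2, 4, float('inf')]
vals = [renyi_entropy(P, a) for a in alphas]
assert all(vals[i] >= vals[i + 1] - 1e-12 for i in range(len(vals) - 1))

# The alpha -> 1 limit recovers the Shannon entropy.
shannon = renyi_entropy(P, 1)
assert abs(renyi_entropy(P, 1 + 1e-6) - shannon) < 1e-4
```

On an infinite alphabet the same formula can diverge for small α even when the Shannon entropy is finite, which is one of the behaviours the paper studies.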
f-Divergences and Related Distances
Derivation of tight bounds on f-divergences and related distances is of interest in information theory and statistics. This paper improves some existing bounds on f-divergences. In some cases, an alternative approach leads to a simplified proof of an existing bound. Following bounds on the chi-squared divergence, an improved version of a reversed Pinsker’s inequality is derived for an arbitra...